Local Behavior of Sparse Analysis Regularization: Applications to Risk Estimation

Authors

  • Samuel Vaiter
  • Charles-Alban Deledalle
  • Gabriel Peyré
  • Charles Dossal
  • Jalal Fadili
Abstract

In this paper, we aim at recovering an unknown signal x0 from noisy measurements y = Φx0 + w, where Φ is an ill-conditioned or singular linear operator and w accounts for some noise. To regularize such an ill-posed inverse problem, we impose an analysis sparsity prior. More precisely, the recovery is cast as a convex optimization program where the objective is the sum of a quadratic data fidelity term and a regularization term formed of the ℓ1-norm of the correlations between the sought-after signal and atoms in a given (generally overcomplete) dictionary. The ℓ1-sparsity analysis prior is weighted by a regularization parameter λ > 0. In this paper, we prove that any minimizer of this problem is a piecewise-affine function of the observations y and the regularization parameter λ. As a byproduct, we exploit these properties to get an objectively guided choice of λ. In particular, we develop an extension of the Generalized Stein Unbiased Risk Estimator (GSURE) and show that it is an unbiased and reliable estimator of an appropriately defined risk. The latter encompasses special cases such as the prediction risk, the projection risk and the estimation risk. We apply these risk estimators to the special case of ℓ1-sparsity analysis regularization. We also discuss implementation issues and propose fast algorithms to solve the ℓ1 analysis minimization problem and to compute the associated GSURE. We finally illustrate the applicability of our framework to parameter(s) selection on several imaging problems.

Email addresses: [email protected] (Samuel Vaiter), [email protected] (Charles-Alban Deledalle), [email protected] (Gabriel Peyré), [email protected] (Charles Dossal), [email protected] (Jalal Fadili)

Preprint submitted to Applied and Computational Harmonic Analysis, October 10, 2012 (hal-00687751, version 2, 10 Oct 2012)
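The ℓ1-analysis minimization the abstract describes, min_x 0.5‖Φx − y‖² + λ‖Dx‖₁ for an analysis dictionary D, can be solved by several standard splitting schemes. Below is a minimal ADMM sketch with the splitting z = Dx; this is an illustrative generic solver, not the specific fast algorithm proposed in the paper, and the variable names (`Phi`, `D`, `lam`, `rho`) are my own.

```python
import numpy as np

def soft_threshold(v, t):
    # Component-wise soft-thresholding: the proximal operator of t*||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_analysis(y, Phi, D, lam, rho=1.0, n_iter=500):
    """Sketch of min_x 0.5||Phi x - y||^2 + lam*||D x||_1 via ADMM
    with the splitting z = D x (generic solver, assumed setup)."""
    z = np.zeros(D.shape[0])
    u = np.zeros(D.shape[0])
    # The x-update solves a fixed linear system; form it once.
    A = Phi.T @ Phi + rho * D.T @ D
    b0 = Phi.T @ y
    for _ in range(n_iter):
        x = np.linalg.solve(A, b0 + rho * D.T @ (z - u))
        z = soft_threshold(D @ x + u, lam / rho)
        u = u + D @ x - z
    return x
```

With D a first-difference operator this yields a total-variation-like piecewise-constant reconstruction; scanning `lam` over a grid and evaluating a risk estimate such as GSURE at each solution is the parameter-selection use case the paper targets.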


Similar articles

Large-scale Inversion of Magnetic Data Using Golub-Kahan Bidiagonalization with Truncated Generalized Cross Validation for Regularization Parameter Estimation

In this paper a fast method for large-scale sparse inversion of magnetic data is considered. The L1-norm stabilizer is used to generate models with sharp and distinct interfaces. To deal with the non-linearity introduced by the L1-norm, a model-space iteratively reweighted least squares algorithm is used. The original model matrix is factorized using the Golub-Kahan bidiagonalization that proje...
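The model-space iteratively reweighted least squares (IRLS) idea mentioned above replaces the non-smooth |m_i| by the quadratic surrogate m_i²/(|m_i| + ε) at the current iterate, so each step is a weighted least-squares solve. A minimal dense sketch follows; the paper combines this with Golub-Kahan bidiagonalization for large-scale problems, which is not shown here, and the names (`G`, `d`, `lam`, `eps`) are assumptions.

```python
import numpy as np

def irls_l1(G, d, lam, n_iter=50, eps=1e-6):
    """IRLS sketch for min_m ||G m - d||^2 + lam*||m||_1.
    The majorizer lam*(m_i^2/|m_i^k| + |m_i^k|)/2 gives the
    normal equations (G^T G + (lam/2) W) m = G^T d, with
    W = diag(1/(|m^k| + eps)). Illustrative only."""
    m = np.linalg.lstsq(G, d, rcond=None)[0]  # initialise at least squares
    for _ in range(n_iter):
        W = np.diag(1.0 / (np.abs(m) + eps))  # reweighting from current iterate
        m = np.linalg.solve(G.T @ G + 0.5 * lam * W, G.T @ d)
    return m
```

In the separable case G = I the fixed point is component-wise soft-thresholding of d at lam/2, which gives a quick sanity check of the iteration.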


Face Recognition using an Affine Sparse Coding approach

Sparse coding is an unsupervised method which learns a set of over-complete bases to represent data such as images and videos. Sparse coding has attracted increasing attention for image classification applications in recent years. But in the cases where we have some similar images from different classes, such as face recognition applications, different images may be classified into the same class, and hen...


Sparse Regularization for High Dimensional Additive Models

We study the behavior of the l1 type of regularization for high dimensional additive models. Our results suggest remarkable similarities and differences between linear regression and additive models in high dimensional settings. In particular, our analysis indicates that, unlike in linear regression, l1 regularization does not yield optimal estimation for additive models of high dimensionality....


Analysis of Multi-stage Convex Relaxation for Sparse Regularization

We consider learning formulations with non-convex objective functions that often occur in practical applications. There are two approaches to this problem:

  • Heuristic methods such as gradient descent that only find a local minimum. A drawback of this approach is the lack of theoretical guarantee showing that the local minimum gives a good solution.
  • Convex relaxation such as L1-regularization...


Computation-Risk Tradeoffs for Covariance-Thresholded Regression

We present a family of linear regression estimators that provides a fine-grained tradeoff between statistical accuracy and computational efficiency. The estimators are based on hard thresholding of the sample covariance matrix entries together with ℓ2-regularization (ridge regression). We analyze the predictive risk of this family of estimators as a function of the threshold and regularization pa...
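The estimator family described above can be sketched in a few lines: hard-threshold the off-diagonal entries of the sample covariance (sparsifying it speeds up downstream solves), then solve the ridge normal equations with the thresholded matrix. The function name and the exact thresholding rule (keeping the diagonal intact) are illustrative assumptions, not the paper's precise definition.

```python
import numpy as np

def cov_thresholded_ridge(X, y, tau, lam):
    """Sketch: beta = (T_tau(X^T X / n) + lam I)^{-1} X^T y / n,
    where T_tau zeroes off-diagonal covariance entries with
    magnitude <= tau. Assumed form, for illustration."""
    n, p = X.shape
    S = X.T @ X / n                        # sample covariance (uncentred)
    T = np.where(np.abs(S) > tau, S, 0.0)  # hard thresholding
    np.fill_diagonal(T, np.diag(S))        # keep the diagonal intact
    return np.linalg.solve(T + lam * np.eye(p), X.T @ y / n)
```

The two extremes bracket the tradeoff: tau = 0 recovers plain ridge regression, while a very large tau keeps only the covariance diagonal, giving the cheapest (and most biased) member of the family.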



Journal title:

Volume  Issue

Pages  -

Publication date: 2012